A data-driven approach to neural architecture search initialization

Abstract

Algorithmic design in neural architecture search (NAS) has received a lot of attention, aiming to improve performance and reduce computational cost. Despite the great advances made, few authors have proposed tailored initialization techniques for NAS. However, the literature shows that a good initial set of solutions facilitates finding the optima. Therefore, in this study we propose a data-driven technique to initialize a population-based NAS algorithm. First, we perform a calibrated clustering analysis of the search space, and second, we extract the centroids and use them to initialize a NAS algorithm. We benchmark our approach against random and Latin hypercube sampling initialization using three population-based algorithms, namely a genetic algorithm, an evolutionary algorithm, and aging evolution, on CIFAR-10. More specifically, we use NAS-Bench-101 to leverage the availability of NAS benchmarks. The results show that, compared to random and Latin hypercube sampling, the proposed initialization enables significant long-term improvements for two of the search baselines, and sometimes in various search scenarios (various training budgets). Besides, we also investigate how an initial population gathered on the tabular benchmark can be used to improve the search on another dataset, So2Sat LCZ-42. Our results show similar improvements on the target dataset despite the limited training budget. Moreover, we analyse the distributions of the solutions obtained and find that the population provided by the data-driven initialization enables retrieving local optima (maxima) of high fitness and similar configurations.
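To make the initialization idea concrete, the sketch below clusters a sample of encoded architectures and seeds the population with the architectures closest to the cluster centroids. It is a minimal illustration, assuming each architecture can be encoded as a fixed-length numeric vector (for NAS-Bench-101, e.g., a flattened adjacency matrix plus one-hot operation labels); sample_architectures and encode are hypothetical helpers, not the authors' implementation.

```python
# Minimal sketch of clustering-based population initialization for NAS.
# Assumes architectures can be encoded as fixed-length numeric vectors;
# `encode` and `sample_architectures` are hypothetical placeholders.
import numpy as np
from sklearn.cluster import KMeans


def data_driven_init(architectures, encode, population_size, seed=0):
    """Cluster a sample of the search space and return the architectures
    closest to the cluster centroids as the initial population."""
    X = np.array([encode(a) for a in architectures])
    km = KMeans(n_clusters=population_size, n_init=10, random_state=seed).fit(X)

    population = []
    for centroid in km.cluster_centers_:
        # Use the real architecture nearest to each centroid.
        nearest = int(np.argmin(np.linalg.norm(X - centroid, axis=1)))
        population.append(architectures[nearest])
    return population


# Usage with hypothetical helpers:
# archs = sample_architectures(n=10_000)   # e.g., drawn from NAS-Bench-101
# pop = data_driven_init(archs, encode, population_size=20)
# then hand `pop` to any population-based search (GA, EA, aging evolution)
```

The centroid representatives would then replace the random (or Latin hypercube) draws that otherwise form the initial population.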


Similar Articles

A new approach to credibility premium for zero-inflated Poisson models for panel data

The main objective of this research is to obtain and compare credibility premiums in under-reported count models for panel data. Predictive premiums are computed under squared-error and exponential loss functions and compared with each other. The desire to receive a bonus is one of the main reasons for not reporting accidents: to keep their discount, policyholders often refrain from reporting low-cost accidents. In this research ...
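For context only: under the usual Bayesian decision-theoretic setup, the predictive premium minimizing expected squared-error loss is the posterior mean, while under a LINEX-type exponential loss it is a log-moment of the posterior. These are the textbook forms and not necessarily the exact loss specifications of the cited paper.

```latex
% Bayes (predictive) premium under squared-error loss: the posterior mean.
P_{\mathrm{SE}} \;=\; \mathbb{E}\!\left[\mu(\Theta)\mid X_1,\dots,X_n\right]

% Under a LINEX-type exponential loss
%   L(\mu, d) = e^{a(d-\mu)} - a(d-\mu) - 1, \qquad a \neq 0,
% the Bayes premium is
P_{\mathrm{EXP}} \;=\; -\frac{1}{a}\,
  \ln \mathbb{E}\!\left[e^{-a\,\mu(\Theta)}\mid X_1,\dots,X_n\right]
```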


Feedforward Neural Network Initialization: an Evolutionary Approach

The initial set of weights used in supervised learning for multilayer neural networks has a strong influence on the learning speed and on the quality of the solution obtained after convergence. An inadequate initial choice of weight values may cause the training process to get stuck in a poor local minimum or to face abnormal numerical problems. Nowadays, there are several proposed te...
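As a toy illustration of evolving initial weights (an assumed setup, not this paper's actual method), the sketch below runs a simple truncation-selection evolutionary loop over candidate initial weight vectors of a single-hidden-layer network, scoring each by its loss on a small batch before any gradient training; the network size, fitness measure, and hyper-parameters are assumptions.

```python
# Toy sketch: evolve an initial weight vector before gradient training.
# Network shape, fitness, and hyper-parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N_HIDDEN = 16


def fitness(weights, X, y):
    """Negative MSE of a one-hidden-layer tanh network on a small batch."""
    n_in = X.shape[1]
    W1 = weights[: n_in * N_HIDDEN].reshape(n_in, N_HIDDEN)
    W2 = weights[n_in * N_HIDDEN:].reshape(N_HIDDEN, 1)
    pred = np.tanh(X @ W1) @ W2
    return -np.mean((pred.ravel() - y) ** 2)


def evolve_init(X, y, pop_size=30, generations=50, sigma=0.1):
    dim = X.shape[1] * N_HIDDEN + N_HIDDEN
    pop = rng.normal(0.0, 0.5, size=(pop_size, dim))
    for _ in range(generations):
        scores = np.array([fitness(w, X, y) for w in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]    # keep best half
        children = parents + rng.normal(0.0, sigma, parents.shape)
        pop = np.vstack([parents, children])                  # elitist refill
    best = max(pop, key=lambda w: fitness(w, X, y))
    return best  # hand this vector to the usual gradient-based training
```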


Progressive Neural Architecture Search

We propose a method for learning CNN structures that is more efficient than previous approaches: instead of using reinforcement learning (RL) or genetic algorithms (GA), we use a sequential model-based optimization (SMBO) strategy, in which we search for architectures in order of increasing complexity, while simultaneously learning a surrogate function to guide the search, similar to A* search....
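A rough sketch of such an SMBO loop is shown below: candidates are expanded level by level, a cheap surrogate (here a ridge-regression stand-in) ranks the frontier, and only the top-ranked cells are actually trained. expand, featurize (fixed-length features), and train_and_score are hypothetical helpers, not the components of the cited paper.

```python
# Rough sketch of sequential model-based (progressive) architecture search.
# `expand`, `featurize`, and `train_and_score` are hypothetical helpers.
import numpy as np
from sklearn.linear_model import Ridge


def progressive_search(seed_cells, expand, featurize, train_and_score,
                       max_levels=5, top_k=8):
    surrogate = Ridge(alpha=1.0)
    candidates = list(seed_cells)
    evaluated = []            # (score, cell) pairs from real training runs
    feats, scores = [], []    # surrogate training data

    for _ in range(max_levels):
        if feats:  # once data exists, let the surrogate prune the frontier
            surrogate.fit(np.array(feats), np.array(scores))
            preds = surrogate.predict(np.array([featurize(c) for c in candidates]))
            candidates = [candidates[i] for i in np.argsort(-preds)[:top_k]]
        for cell in candidates:               # train only surviving candidates
            s = train_and_score(cell)
            feats.append(featurize(cell))
            scores.append(s)
            evaluated.append((s, cell))
        # Grow each survivor into its more complex children for the next level.
        candidates = [child for c in candidates for child in expand(c)]

    return max(evaluated, key=lambda t: t[0])  # best (score, cell) found
```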


A Functional Architecture Approach to Neural Systems

The technology for the design of systems to perform extremely complex combinations of real-time functionality has developed over a long period. This technology is based on the use of a hardware architecture with a physical separation into memory and processing, and a software architecture which divides functionality into a disciplined hierarchy of software components which exchange unambiguous ...


An Enterprise Architecture Driven Approach to Virtualisation

Organisations have shown a significant interest in the adoption of virtualisation technology for improving the efficiency of their Data Centres (DC) from both the resource performance and cost efficiency viewpoints. By improving the efficiency of data centres we can sustainably manage their impact on the environment by controlling their energy consumption. The intentions are clear but how best ...



Journal

Journal title: Annals of Mathematics and Artificial Intelligence

Year: 2023

ISSN: 1573-7470, 1012-2443

DOI: https://doi.org/10.1007/s10472-022-09823-0